This paper introduces a deep-learning approach to photographic style transfer that handles a large variety of image content while faithfully transferring the reference style. Our approach builds upon the recent work on painterly transfer that separates style from the content of an image by considering different layers of a neural network. However, as is, this approach is not suitable for photorealistic style transfer. Even when both the input and reference images are photographs, the output still exhibits distortions reminiscent of a painting. Our contribution is to constrain the transformation from the input to the output to be locally affine in colorspace, and to express this constraint as a custom fully differentiable energy term. We show that this approach successfully suppresses distortion and yields satisfying photorealistic style transfers in a broad variety of scenarios, including transfer of the time of day, weather, season, and artistic edits.
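The core idea — that the input-to-output transformation should be locally affine in colorspace — can be illustrated with a minimal sketch. The function below (name `local_affine_energy` and the patch-wise least-squares formulation are illustrative assumptions, not the paper's exact energy term) fits, for each small patch, the best affine map from input colors to output colors and sums the residuals; an output that preserves photorealism keeps this energy small, while painterly distortions inflate it:

```python
import numpy as np

def local_affine_energy(inp, out, patch=3):
    """Illustrative penalty for deviation from a locally affine
    colorspace mapping between an input image `inp` and a stylized
    output `out`, both of shape (H, W, 3) with float values.

    For each non-overlapping `patch` x `patch` window, fit the
    least-squares affine map A from input RGB (plus a bias term) to
    output RGB, and accumulate the squared residual of that fit.
    """
    h, w, _ = inp.shape
    energy = 0.0
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            I = inp[y:y + patch, x:x + patch].reshape(-1, 3)
            O = out[y:y + patch, x:x + patch].reshape(-1, 3)
            # Affine design matrix: RGB columns plus a constant bias column.
            A = np.hstack([I, np.ones((I.shape[0], 1))])
            coef, *_ = np.linalg.lstsq(A, O, rcond=None)
            # Residual of the best local affine fit for this patch.
            energy += float(np.sum((A @ coef - O) ** 2))
    return energy
```

For example, an output produced by a single global affine color shift (`out = 0.5 * inp + 0.2`) yields an energy near zero, whereas a nonlinear per-pixel edit such as `out = inp ** 3` generally does not. The paper enforces the analogous constraint as a fully differentiable energy term so it can be minimized jointly with the style-transfer objective during optimization.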